Results

In total, we had 53 participants after excluding those for whom the fixation cross did not work (due to experimenter error) and two for whom the eye-tracker would not initially calibrate.

As preregistered, we excluded participants on the basis of time on task, operationalized as the proportion of trial time during which participants fixated on either the pictures or the response scale. We then plotted boxplots and histograms of each participant's mean proportion to identify any natural break point and to detect outliers.

Binary task

As can be seen from the boxplot and histogram, only one participant was an outlier on time on task in the binary task (i.e. below the first quartile minus 1.5 times the interquartile range).
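As a minimal sketch of this fence rule (base R, with hypothetical per-participant means standing in for the real data):

```r
# Hypothetical mean proportions of time on task, one per participant
means <- data.frame(participantNo = 1:5,
                    meanProp = c(0.90, 0.88, 0.92, 0.91, 0.20))

# Tukey's lower fence: first quartile minus 1.5 times the interquartile range
lower.fence <- quantile(means$meanProp, 0.25) - 1.5 * IQR(means$meanProp)
outliers <- means$participantNo[means$meanProp < lower.fence]
outliers  # only participant 5 falls below the fence
```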

Continuous task

As can be seen from the boxplot and histogram, no participants were outliers in the continuous task.

Therefore, we excluded one participant, leaving 52 participants, all of whom completed all tasks.

Reaction times

Additionally, we excluded all trials with reaction times below 200 ms, as well as trials where the reaction time was more than 3 standard deviations above the mean reaction time across all task trials (i.e. not including the liking-rating task).

This resulted in excluding 2.04% of trials in the continuous task and 1.23% of trials in the binary task. The maximum proportion of trials excluded for a single participant was 15% across the two tasks.
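The reaction-time criteria above can be sketched as follows (toy values; the real 3 SD cutoff is computed over all task trials):

```r
# Toy reaction times in ms
rt <- c(150, 420, 650, 800, 980, 1200, 9000)

# Keep trials at or above 200 ms and within 3 SDs of the mean
cutoff.high <- mean(rt) + 3 * sd(rt)
keep <- rt >= 200 & rt <= cutoff.high
# With only seven values the 3 SD fence is wide, so here only the
# 150 ms trial is dropped
```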

Analysis

Data exploration

ggpairs(data[task=="binary", c("attention.difference", "value.difference")],
        title="Correlations between predictors for binary task.")

cor.test(data$attention.difference[data$task=="binary"], data$value.difference[data$task=="binary"])
## 
##  Pearson's product-moment correlation
## 
## data:  data$attention.difference[data$task == "binary"] and data$value.difference[data$task == "binary"]
## t = 16.556, df = 2566, p-value < 2.2e-16
## alternative hypothesis: true correlation is not equal to 0
## 95 percent confidence interval:
##  0.2752894 0.3451936
## sample estimates:
##       cor 
## 0.3106615
ggpairs(data[task=="continuous", c("attention.difference", "value.difference")],
        title="Correlations between predictors for continuous task.")

cor.test(data$attention.difference[data$task=="continuous"], 
         data$value.difference[data$task=="continuous"])
## 
##  Pearson's product-moment correlation
## 
## data:  data$attention.difference[data$task == "continuous"] and data$value.difference[data$task == "continuous"]
## t = 18.174, df = 2545, p-value < 2.2e-16
## alternative hypothesis: true correlation is not equal to 0
## 95 percent confidence interval:
##  0.3040912 0.3728586
## sample estimates:
##       cor 
## 0.3389275
ggpairs(data[task!="valuation", c("attention.difference", "value.difference")],
        title="Correlations between predictors for both tasks.")

ggplot(data[task!="valuation"], aes(x = task, y = value.difference)) +
  stat_sum(aes(size = ..n.., group = 1)) +
  scale_size_area(max_size=10)

ggplot(data[task!="valuation"], aes(x = task, y = attention.difference)) +
  geom_jitter(alpha = .1) +
  geom_violin(alpha = .75) 

ggplot(data[task!="valuation"], aes(x=attention.difference, y = rt, fill=factor(task))) +
  geom_point() +
  facet_grid(. ~ task)

ggplot(data[task!="valuation"], aes(x=value.difference, y = rt, fill=factor(task))) +
  geom_point() +
  facet_grid(. ~ task)

ggplot(data[task!="valuation"], aes(x=factor(recodedResponse), y = attention.difference, fill=factor(recodedResponse))) +
  geom_boxplot()

ggplot(data[task!="valuation"], aes(x=factor(recodedResponse), y = value.difference, fill=factor(recodedResponse))) +
  geom_boxplot()

hist(data$response[data$task=="continuous"])

hist(abs(data$response[data$task=="continuous"]-0.5))

hist(data$response[data$task=="continuous" & data$taskOrder==1])

hist(data$response[data$task=="continuous" & data$taskOrder==2])

We were interested in whether the attention × value interaction effect was greater in the continuous task or in the binary task.

One intuition was that participants might take longer to deliberate in the continuous task than in the binary task, and that this extra deliberation might lead to different choices.

rt <- lmer(rt ~ attention.difference*value.difference*task + 
             (1 + attention.difference*value.difference||participantNo), 
          data=data[task!="valuation"], REML=F, control=lmerControl(optimizer="Nelder_Mead"))
## boundary (singular) fit: see ?isSingular
summary(rt)
summary(rePCA(rt)) # 2/4 components have variance greater than 1%

# Removing components with 0 variance: value.difference
rt1 <- lmer(rt ~ attention.difference*value.difference*task + 
             (1 + attention.difference*value.difference - value.difference||participantNo), 
          data=data[task!="valuation"], REML=F, control=lmerControl(optimizer="Nelder_Mead"))
summary(rt1)
summary(rePCA(rt1)) # 2/3 have variance>1%
anova(rt, rt1) # N.s. prefer rt1

# Remove components with smallest variance: interaction
rt2 <- lmer(rt ~ attention.difference*value.difference*task + 
             (1 + attention.difference ||participantNo), 
          data=data[task!="valuation"], REML=F, control=lmerControl(optimizer="Nelder_Mead"))
summary(rt2)
anova(rt1, rt2) # N.s. prefer rt2

rt3 <- lmer(rt ~ attention.difference*value.difference*task + 
             (1|participantNo), 
          data=data[task!="valuation"], REML=F, control=lmerControl(optimizer="Nelder_Mead"))
summary(rt3)
anova(rt2, rt3) # N.s. prefer rt3

rt4 <- lm(rt ~ attention.difference*value.difference*task , 
          data=data[task!="valuation"])
summary(rt4)
anova(rt3, rt4) # Sig: prefer rt3
summary(rt3)
## Linear mixed model fit by maximum likelihood  ['lmerMod']
## Formula: 
## rt ~ attention.difference * value.difference * task + (1 | participantNo)
##    Data: data[task != "valuation"]
## Control: lmerControl(optimizer = "Nelder_Mead")
## 
##      AIC      BIC   logLik deviance df.resid 
##  86502.9  86568.3 -43241.4  86482.9     5105 
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -2.7476 -0.6068 -0.1437  0.4100  5.5475 
## 
## Random effects:
##  Groups        Name        Variance Std.Dev.
##  participantNo (Intercept)  811427   900.8  
##  Residual                  1235886  1111.7  
## Number of obs: 5115, groups:  participantNo, 52
## 
## Fixed effects:
##                                                      Estimate Std. Error
## (Intercept)                                          2500.064    127.035
## attention.difference                                  -65.120     84.291
## value.difference                                       -1.856     10.763
## taskcontinuous                                        663.016     32.874
## attention.difference:value.difference                -178.467     38.468
## attention.difference:taskcontinuous                    69.555    122.382
## value.difference:taskcontinuous                         2.232     15.260
## attention.difference:value.difference:taskcontinuous  -60.022     54.609
##                                                      t value
## (Intercept)                                           19.680
## attention.difference                                  -0.773
## value.difference                                      -0.172
## taskcontinuous                                        20.169
## attention.difference:value.difference                 -4.639
## attention.difference:taskcontinuous                    0.568
## value.difference:taskcontinuous                        0.146
## attention.difference:value.difference:taskcontinuous  -1.099
## 
## Correlation of Fixed Effects:
##             (Intr) attnt. vl.dff tskcnt att.:. attn.: vl.df:
## attntn.dffr  0.004                                          
## valu.dffrnc  0.003 -0.316                                   
## taskcontins -0.127 -0.015 -0.011                            
## attntn.df:. -0.056  0.018 -0.011  0.213                     
## attntn.dff: -0.003 -0.689  0.219  0.049 -0.015              
## vl.dffrnc:t -0.002  0.223 -0.706 -0.008  0.007 -0.330       
## attntn.d:.:  0.039 -0.011  0.007 -0.320 -0.688 -0.009  0.026
models.binary <- data[task=="binary", 
               as.list(coef(lm(.SD$rt ~ .SD$attention.difference*.SD$value.difference))), 
               by=participantNo]
colnames(models.binary) <- c("participantNo", "intercept", "attention.difference", "value.difference", "interaction")

# TODO: attention vs. attention (lm vs. glmer); trial predictions

ggplot() +
  scale_x_continuous(name="Attention", limits=c(-1, 1)) +
  scale_y_continuous(name="RT", limits=c(0, 6000)) +
  scale_linetype(name="s") +
  geom_abline(data=models.binary, 
              mapping=aes(slope=attention.difference, intercept=intercept, 
                          color=factor(participantNo)))

ggplot() +
  scale_x_continuous(name="Value", limits=c(-6, 6)) +
  scale_y_continuous(name="RT", limits=c(0, 6000)) +
  scale_linetype(name="s") +
  geom_abline(data=models.binary, 
              mapping=aes(slope=value.difference, intercept=intercept, 
                          color=factor(participantNo)))

models.continuous <- data[task=="continuous", 
               as.list(coef(lm(.SD$rt ~ .SD$attention.difference*.SD$value.difference))), 
               by=participantNo]
colnames(models.continuous) <- c("participantNo", "intercept", "attention.difference", 
                                 "value.difference", "interaction")

ggplot() +
  scale_x_continuous(name="Attention", limits=c(-1, 1)) +
  scale_y_continuous(name="RT", limits=c(0, 6000)) +
  scale_linetype(name="s") +
  geom_abline(data=models.continuous, 
              mapping=aes(slope=attention.difference, intercept=intercept, 
                          color=factor(participantNo)))

ggplot() +
  scale_x_continuous(name="Value", limits=c(-6, 6)) +
  scale_y_continuous(name="RT", limits=c(0, 6000)) +
  scale_linetype(name="s") +
  geom_abline(data=models.continuous, 
              mapping=aes(slope=value.difference, intercept=intercept, 
                          color=factor(participantNo)))

Choice

Another possibility was that choice itself was affected by task.

full <- glmer(recodedResponse ~ attention.difference*value.difference*task +
                (1 + attention.difference*value.difference||participantNo),
              data=data[task!="valuation",], family="binomial", 
              control=glmerControl(optimizer="Nelder_Mead"))
## Warning in checkConv(attr(opt, "derivs"), opt$par, ctrl =
## control$checkConv, : Model failed to converge with max|grad| = 0.0837133
## (tol = 0.001, component 1)
summary(full)
summary(rePCA(full)) # 2/4 components have variance>1%

# Removed component with smallest variance (interaction term)
choice1 <- glmer(recodedResponse ~ attention.difference*value.difference*task +
                (1 + attention.difference + value.difference||participantNo),
              data=data[task!="valuation",], family="binomial", 
              control=glmerControl(optimizer="Nelder_Mead"))
## Warning in checkConv(attr(opt, "derivs"), opt$par, ctrl =
## control$checkConv, : Model failed to converge with max|grad| = 0.00934545
## (tol = 0.001, component 1)
summary(choice1)
anova(full, choice1) # N.s so prefer choice1

choice2 <- glmer(recodedResponse ~ attention.difference*value.difference*task +
                (1 + attention.difference||participantNo),
              data=data[task!="valuation",], family="binomial", 
              control=glmerControl(optimizer="Nelder_Mead"))
summary(choice2)
anova(choice2, choice1) # Sig so prefer choice1

choice3 <- glmer(recodedResponse ~ attention.difference*value.difference*task +
                (1 + value.difference||participantNo),
              data=data[task!="valuation",], family="binomial", 
              control=glmerControl(optimizer="Nelder_Mead"))
summary(choice3)
anova(choice3, choice1) # Sig so prefer choice1
# Per-participant choice fits; note glm() without family= defaults to gaussian
# (a linear-probability fit), not logistic regression
models.binary <- data[task=="binary", 
               as.list(coef(glm(.SD$recodedResponse ~ .SD$attention.difference*.SD$value.difference))), 
               by=participantNo]
colnames(models.binary) <- c("participantNo", "intercept", "attention.difference", "value.difference", "interaction")

ggplot() +
  scale_x_continuous(name="Attention", limits=c(-1, 1)) +
  scale_y_continuous(name="Choice", limits=c(0, 1)) +
  scale_linetype(name="s") +
  geom_abline(data=models.binary, 
              mapping=aes(slope=attention.difference, intercept=intercept, 
                          color=factor(participantNo)))

ggplot() +
  scale_x_continuous(name="Value", limits=c(-6, 6)) +
  scale_y_continuous(name="Choice", limits=c(0, 1)) +
  scale_linetype(name="s") +
  geom_abline(data=models.binary, 
              mapping=aes(slope=value.difference, intercept=intercept, 
                          color=factor(participantNo)))

models.continuous <- data[task=="continuous", 
               as.list(coef(glm(.SD$recodedResponse ~ .SD$attention.difference*.SD$value.difference))), 
               by=participantNo]
colnames(models.continuous) <- c("participantNo", "intercept", "attention.difference", 
                                 "value.difference", "interaction")

ggplot() +
  scale_x_continuous(name="Attention", limits=c(-1, 1)) +
  scale_y_continuous(name="Choice", limits=c(0, 1)) +
  scale_linetype(name="s") +
  geom_abline(data=models.continuous, 
              mapping=aes(slope=attention.difference, intercept=intercept, 
                          color=factor(participantNo)))

ggplot() +
  scale_x_continuous(name="Value", limits=c(-6, 6)) +
  scale_y_continuous(name="Choice", limits=c(0, 1)) +
  scale_linetype(name="s") +
  geom_abline(data=models.continuous, 
              mapping=aes(slope=value.difference, intercept=intercept, 
                          color=factor(participantNo)))

# Removed component with smallest variance (interaction term)
choice <- glmer(recodedResponse ~ attention.difference*value.difference +
                (1 + attention.difference + value.difference||participantNo),
              data=data[task!="valuation",], family="binomial", 
              control=glmerControl(optimizer="Nelder_Mead"))
summary(choice)
## Generalized linear mixed model fit by maximum likelihood (Laplace
##   Approximation) [glmerMod]
##  Family: binomial  ( logit )
## Formula: recodedResponse ~ attention.difference * value.difference + (1 +  
##     attention.difference + value.difference || participantNo)
##    Data: data[task != "valuation", ]
## Control: glmerControl(optimizer = "Nelder_Mead")
## 
##      AIC      BIC   logLik deviance df.resid 
##   3780.0   3825.8  -1883.0   3766.0     5108 
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -8.2871 -0.3779 -0.0418  0.3763 11.2443 
## 
## Random effects:
##  Groups          Name                 Variance Std.Dev.
##  participantNo   (Intercept)          0.0518   0.2276  
##  participantNo.1 attention.difference 5.5254   2.3506  
##  participantNo.2 value.difference     0.1115   0.3340  
## Number of obs: 5115, groups:  participantNo, 52
## 
## Fixed effects:
##                                       Estimate Std. Error z value Pr(>|z|)
## (Intercept)                           0.009662   0.053290   0.181    0.856
## attention.difference                  5.938762   0.400668  14.822   <2e-16
## value.difference                      0.901925   0.056708  15.905   <2e-16
## attention.difference:value.difference 0.017726   0.129179   0.137    0.891
##                                          
## (Intercept)                              
## attention.difference                  ***
## value.difference                      ***
## attention.difference:value.difference    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Correlation of Fixed Effects:
##             (Intr) attnt. vl.dff
## attntn.dffr  0.029              
## valu.dffrnc  0.008  0.056       
## attntn.df:.  0.091 -0.017  0.019
# Per-participant fits; glm() here is gaussian (linear probability), so these
# slopes are not on the logit scale of the glmer coefficients merged in below
models <- data[, as.list(coef(glm(.SD$recodedResponse ~ .SD$attention.difference*.SD$value.difference))), 
               by=participantNo]
colnames(models) <- c("participantNo", "interceptF", "attention.differenceF", "value.differenceF", 
                      "interactionF")

models <- cbind(models, coef(choice)$participantNo)

ggplot(models, aes(x=attention.differenceF, y=attention.difference)) + 
  geom_point() + 
  scale_x_continuous(name="Participant wise glm", limits=c(-1, 1.5)) +
  scale_y_continuous(name="GLMEMs", limits=c(-1, 10)) +
  ggtitle("Comparison of mixed effects models with individual fits (attention)") + 
  geom_abline(intercept=0, slope=1)

ggplot(models, aes(x=value.differenceF, y=value.difference)) + 
  geom_point() + 
  scale_x_continuous(name="Participant wise glm", limits=c(-0.5, .5)) +
  scale_y_continuous(name="GLMEMs", limits=c(-.5, 1.5)) +
  ggtitle("Comparison of mixed effects models with individual fits (value)") + 
  geom_abline(intercept=0, slope=1)

binary.full <- glmer(response ~ attention.difference*value.difference +
                       (1 + attention.difference*value.difference|| participantNo),
                     family="binomial", data=data[task=="binary",])
summary(binary.full)
## Generalized linear mixed model fit by maximum likelihood (Laplace
##   Approximation) [glmerMod]
##  Family: binomial  ( logit )
## Formula: 
## response ~ attention.difference * value.difference + (1 + attention.difference *  
##     value.difference || participantNo)
##    Data: data[task == "binary", ]
## 
##      AIC      BIC   logLik deviance df.resid 
##   1923.4   1970.2   -953.7   1907.4     2560 
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -5.4467 -0.3770 -0.0415  0.3695  6.4872 
## 
## Random effects:
##  Groups          Name                                  Variance  Std.Dev.
##  participantNo   (Intercept)                           4.709e-02 0.21700 
##  participantNo.1 attention.difference                  7.097e+00 2.66396 
##  participantNo.2 value.difference                      1.598e-01 0.39980 
##  participantNo.3 attention.difference:value.difference 8.410e-08 0.00029 
## Number of obs: 2568, groups:  participantNo, 52
## 
## Fixed effects:
##                                       Estimate Std. Error z value Pr(>|z|)
## (Intercept)                            0.02230    0.06874   0.324    0.746
## attention.difference                   5.86179    0.50266  11.661   <2e-16
## value.difference                       0.94526    0.07534  12.547   <2e-16
## attention.difference:value.difference  0.05310    0.17986   0.295    0.768
##                                          
## (Intercept)                              
## attention.difference                  ***
## value.difference                      ***
## attention.difference:value.difference    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Correlation of Fixed Effects:
##             (Intr) attnt. vl.dff
## attntn.dffr 0.034               
## valu.dffrnc 0.019  0.095        
## attntn.df:. 0.083  0.011  0.017
continuous.full.recoded <- glmer(recodedResponse ~ attention.difference*value.difference +
                       (1 + attention.difference*value.difference|| participantNo),
                     family="binomial", data=data[task=="continuous",])
summary(continuous.full.recoded)
## Generalized linear mixed model fit by maximum likelihood (Laplace
##   Approximation) [glmerMod]
##  Family: binomial  ( logit )
## Formula: recodedResponse ~ attention.difference * value.difference + (1 +  
##     attention.difference * value.difference || participantNo)
##    Data: data[task == "continuous", ]
## 
##      AIC      BIC   logLik deviance df.resid 
##   1898.6   1945.3   -941.3   1882.6     2539 
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -7.7933 -0.3850 -0.0390  0.3557 10.0295 
## 
## Random effects:
##  Groups          Name                                  Variance Std.Dev.
##  participantNo   (Intercept)                           0.09555  0.3091  
##  participantNo.1 attention.difference                  6.17968  2.4859  
##  participantNo.2 value.difference                      0.07244  0.2691  
##  participantNo.3 attention.difference:value.difference 0.20943  0.4576  
## Number of obs: 2547, groups:  participantNo, 52
## 
## Fixed effects:
##                                        Estimate Std. Error z value
## (Intercept)                           -0.005194   0.075729  -0.069
## attention.difference                   6.461078   0.508830  12.698
## value.difference                       0.886444   0.061164  14.493
## attention.difference:value.difference -0.032347   0.213701  -0.151
##                                       Pr(>|z|)    
## (Intercept)                              0.945    
## attention.difference                    <2e-16 ***
## value.difference                        <2e-16 ***
## attention.difference:value.difference    0.880    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Correlation of Fixed Effects:
##             (Intr) attnt. vl.dff
## attntn.dffr  0.038              
## valu.dffrnc  0.001  0.136       
## attntn.df:.  0.096 -0.050  0.035
continuous.full <- glmer(response ~ attention.difference*value.difference +
                       (1 + attention.difference*value.difference|| participantNo),
                     family="binomial", data=data[task=="continuous",])
## Warning in eval(family$initialize, rho): non-integer #successes in a
## binomial glm!
## boundary (singular) fit: see ?isSingular
summary(continuous.full)
## Generalized linear mixed model fit by maximum likelihood (Laplace
##   Approximation) [glmerMod]
##  Family: binomial  ( logit )
## Formula: 
## response ~ attention.difference * value.difference + (1 + attention.difference *  
##     value.difference || participantNo)
##    Data: data[task == "continuous", ]
## 
##      AIC      BIC   logLik deviance df.resid 
##   1951.7   1998.5   -967.9   1935.7     2539 
## 
## Scaled residuals: 
##      Min       1Q   Median       3Q      Max 
## -10.4895  -0.4125   0.0279   0.4882  12.4563 
## 
## Random effects:
##  Groups          Name                                  Variance  Std.Dev. 
##  participantNo   (Intercept)                           4.905e-09 7.004e-05
##  participantNo.1 attention.difference                  0.000e+00 0.000e+00
##  participantNo.2 value.difference                      0.000e+00 0.000e+00
##  participantNo.3 attention.difference:value.difference 1.302e-10 1.141e-05
## Number of obs: 2547, groups:  participantNo, 52
## 
## Fixed effects:
##                                        Estimate Std. Error z value
## (Intercept)                            0.005249   0.058032   0.090
## attention.difference                   5.745644   0.298015  19.280
## value.difference                       0.782948   0.038359  20.411
## attention.difference:value.difference -0.014018   0.177877  -0.079
##                                       Pr(>|z|)    
## (Intercept)                              0.928    
## attention.difference                    <2e-16 ***
## value.difference                        <2e-16 ***
## attention.difference:value.difference    0.937    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Correlation of Fixed Effects:
##             (Intr) attnt. vl.dff
## attntn.dffr  0.075              
## valu.dffrnc  0.005  0.189       
## attntn.df:.  0.172 -0.060  0.081
## convergence code: 0
## boundary (singular) fit: see ?isSingular
# Note: inside a model formula `^2` expands interactions rather than squaring;
# as the printed formula shows, this reduces to a linear attention.difference
# term. Use I(attention.difference^2) for a true quadratic.
cont <- lmer(response ~ attention.difference^2 + (1+attention.difference||participantNo),
              data=data[task=="continuous"])
summary(cont)
## Linear mixed model fit by REML ['lmerMod']
## Formula: response ~ attention.difference^2 + ((1 | participantNo) + (0 +  
##     attention.difference | participantNo))
##    Data: data[task == "continuous"]
## 
## REML criterion at convergence: 999
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -2.8281 -0.7253 -0.0266  0.7287  3.2038 
## 
## Random effects:
##  Groups          Name                 Variance Std.Dev.
##  participantNo   (Intercept)          0.000409 0.02022 
##  participantNo.1 attention.difference 0.026886 0.16397 
##  Residual                             0.084672 0.29098 
## Number of obs: 2547, groups:  participantNo, 52
## 
## Fixed effects:
##                      Estimate Std. Error t value
## (Intercept)          0.503388   0.006477   77.72
## attention.difference 0.723409   0.032296   22.40
## 
## Correlation of Fixed Effects:
##             (Intr)
## attntn.dffr 0.044

So, it looks like there are main effects of attention and value on choice, main effects of task on reaction times (as expected), an attention × value interaction on reaction times, and a hint of a three-way interaction on reaction times.

Notes

So, in this analysis there is a correlation of roughly 0.32 between attention difference and value difference (0.31 in the binary task, 0.34 in the continuous task): participants seem to look longer at the higher-valued option.

To do